Conversation

@jsch8q commented Jun 30, 2025

In PyTorch >= 2.6, the betas parameter of the Adam optimizer must explicitly be either a pair of floats or a pair of tensors.
(See, for example, PR #134171 in the official PyTorch repo.)

It seems that this fix is all that is needed to make the code run properly on these newer versions of PyTorch; I had no problems after making this change.
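
For illustration, here is a minimal sketch of the kind of change this refers to (the model, learning rate, and beta values below are placeholders, not the actual code in this repository):

```python
import torch

# Hypothetical model, for illustration only.
model = torch.nn.Linear(4, 2)

# In practice beta1/beta2 may come from a config file and may not be
# plain Python floats (e.g. numpy scalars or strings). Placeholder values:
beta1, beta2 = 0.9, 0.999

# PyTorch >= 2.6 validates that `betas` is a pair of floats or a pair of
# tensors, so cast explicitly before constructing the optimizer.
optimizer = torch.optim.Adam(
    model.parameters(),
    lr=1e-3,
    betas=(float(beta1), float(beta2)),
)
```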

